
    Robust stability theory for stochastic dynamical systems

    In this work, we focus on developing analysis tools related to stability theory for certain classes of stochastic dynamical systems that permit non-unique solutions. The non-unique nature of solutions arises primarily because the system dynamics are modeled by set-valued mappings. There are two main motivations for studying such classes of systems. First, understanding such systems is crucial to developing a robust stability theory. Second, such system models allow flexibility in control design problems. We begin by developing analysis tools for a simple class of discrete-time stochastic systems modeled by set-valued maps and then extend the results to a larger class of stochastic hybrid systems. Stochastic hybrid systems are a class of dynamical systems that combine continuous-time dynamics, discrete-time dynamics and randomness. The analysis tools are established for properties like global asymptotic stability in probability and global recurrence. We focus on establishing results related to sufficient conditions for stability, weak sufficient conditions for stability, robust stability conditions and converse Lyapunov theorems. A primary assumption in this work is that the stochastic system satisfies some mild regularity properties with respect to the state variable and the random input. These regularity properties are needed to establish the existence of random solutions and results on sequential compactness for the solution set of the stochastic system.

    We now briefly explain the four main types of analysis tools studied in this work. Sufficient conditions for stability involve Lyapunov-like functions satisfying strict decrease properties along solutions that are needed to verify stability properties. Weak sufficient conditions relax the strict decrease requirement on the Lyapunov-like function along solutions and rely either on knowledge about the behavior of the solutions on certain level sets of the Lyapunov-like function or on multiple nested non-strict Lyapunov-like functions to conclude stability properties; the invariance principle and Matrosov function theory fall into this category. Robust stability conditions determine when stability properties are robust to sufficiently small perturbations of the nominal system data. Robustness of stability is an important concept in the presence of measurement errors, disturbances and parametric uncertainty for the nominal system. We study two approaches to verifying robustness: the first relies on the regularity properties of the system data, and the second uses Lyapunov functions. Robustness analysis is an area where the notion of set-valued dynamical systems arises naturally, and it underscores the reason for our study of such systems. Finally, we focus on developing converse Lyapunov theorems for stochastic systems. Converse Lyapunov theorems establish the equivalence between asymptotic properties of a system and the existence of a function that satisfies a decrease condition along the solutions. Strong forms of the converse theorem imply the existence of smooth Lyapunov functions. A fundamental way in which our results differ from the results in the literature on converse theorems for stochastic systems is that we exploit robustness of the stability property to establish the existence of a smooth Lyapunov function.
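
    To make the kind of strict decrease condition described above concrete, the following LaTeX sketch states a standard Lyapunov-type sufficient condition for global asymptotic stability in probability of a discrete-time stochastic system modeled by a set-valued map. The symbols F, V, alpha_1, alpha_2, rho and mu are introduced here for illustration only and are not taken from the thesis itself.

    % A set-valued discrete-time stochastic system and a Lyapunov-type
    % sufficient condition of the kind discussed in the abstract
    % (notation assumed for illustration, not quoted from the thesis).
    \[
      x^{+} \in F(x, v), \qquad v \sim \mu \ \text{i.i.d.},
    \]
    \[
      \alpha_1(|x|) \le V(x) \le \alpha_2(|x|), \qquad
      \int_{\mathcal{V}} \sup_{g \in F(x,v)} V(g)\, \mu(\mathrm{d}v) \le V(x) - \rho(|x|),
    \]
    \[
      \alpha_1, \alpha_2 \in \mathcal{K}_{\infty}, \qquad
      \rho \ \text{continuous and positive definite}.
    \]

    The supremum over the set-valued map F accounts for non-unique solutions, while the integral with respect to mu expresses the decrease in expectation; the weak sufficient conditions described above relax the strict decrease of V along solutions.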

    Recurrence Principles and Their Application to Stability Theory for a Class of Stochastic Hybrid Systems


    Discrete-time stochastic control systems: Examples of robustness to strictly causal perturbations

    Robust stability for discrete-time stochastic systems employing possibly discontinuous control laws is the focus of this paper. Through novel examples, we illustrate the fact that the existence of a continuous stochastic Lyapunov function implies robustness of stability to sufficiently small, state-dependent, strictly causal, worst-case perturbations, irrespective of the continuity of the stabilizing control law. We emphasize the role of strict causality through an example for which a continuous stochastic Lyapunov function is not sufficient for robustness to arbitrarily small worst-case perturbations that are not strictly causal. Finally, we illustrate our main result for a stochastic control system which admits a continuous Lyapunov function, but one associated with a necessarily discontinuous control law.
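
    As a rough illustration of the perturbation structure the abstract refers to, the LaTeX sketch below contrasts a nominal closed loop with a strictly causally perturbed one; the symbols f, kappa, d_1, d_2 and delta are assumptions made here for illustration and are not quoted from the paper.

    % Nominal closed loop with a (possibly discontinuous) feedback kappa:
    \[
      x^{+} = f\big(x, \kappa(x), v\big), \qquad v \sim \mu \ \text{i.i.d.}
    \]
    % A state-dependent, strictly causal perturbation: d_1 and d_2 may be
    % chosen (even adversarially) as functions of the state, but not of the
    % current random sample v.
    \[
      x^{+} = f\big(x + d_1, \kappa(x + d_1), v\big) + d_2, \qquad
      |d_1| \le \delta(x), \quad |d_2| \le \delta(x).
    \]

    The point of strict causality is that the perturbations cannot depend on the current sample v; the counterexample mentioned in the abstract shows that allowing such dependence can destroy robustness even when a continuous stochastic Lyapunov function exists.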

    Discrete-time stochastic control systems: A continuous Lyapunov function implies robustness to strictly causal perturbations

    Discrete-time stochastic systems employing possibly discontinuous state-feedback control laws are addressed. Allowing discontinuous feedbacks is fundamental for stochastic systems regulated, for instance, by optimization-based control laws. We introduce generalized random solutions for discontinuous stochastic systems to guarantee the existence of solutions and to generate enough solutions to give an accurate picture of robustness with respect to strictly causal perturbations. Under basic regularity conditions, the existence of a continuous stochastic Lyapunov function is sufficient to establish that asymptotic stability in probability for the closed-loop system is robust to sufficiently small, state-dependent, strictly causal, worst-case perturbations. Robustness of a weaker stochastic stability property called recurrence is also shown in a global sense in the case of state-dependent perturbations, and in a semiglobal practical sense in the case of persistent perturbations. An example shows that a continuous stochastic Lyapunov function is not sufficient for robustness to arbitrarily small worst-case disturbances that are not strictly causal. Our positive results are also illustrated by examples.
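
    For reference, one common way to state the recurrence property mentioned above is sketched below in LaTeX; the open, bounded set O and the solution notation are assumptions used for illustration rather than definitions quoted from the paper.

    % Global recurrence of an open, bounded set O: every (complete) solution
    % reaches O almost surely, from every initial condition.
    \[
      \mathbb{P}\big( \exists\, k \in \mathbb{Z}_{\ge 0} : x(k) \in \mathcal{O} \big) = 1
      \qquad \text{for every solution } x \ \text{and every } x(0) \in \mathbb{R}^n.
    \]

    Recurrence only asks that solutions visit O with probability one, which is why it is weaker than asymptotic stability in probability and why its robustness is established in the global and semiglobal practical senses described above.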